This repository has been archived by the owner on Oct 9, 2023. It is now read-only.

Set Max Limit on number of executors to be launched in spark plugin #101

Closed · wants to merge 6 commits

Conversation

migueltol22
Contributor

TL;DR

Allows configuring a max limit on the number of Spark executors by specifying spark.executor.instances: <max_limit> as part of the default Spark configuration.
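As described, the cap is set through the plugin's default Spark configuration. A sketch of what that might look like in the plugin config file (the exact keys and nesting here are assumptions for illustration, not taken from this PR):

```yaml
plugins:
  spark:
    spark-config-default:
      # hypothetical max limit on executors launched per Spark app
      - spark.executor.instances: "100"
```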

Type

  • Bug Fix
  • Feature
  • Plugin

Are all requirements met?

  • Code completed
  • Smoke tested
  • Unit tests added
  • Code documentation added
  • Any pending items have an associated Issue

Complete description

Describe how you fixed the bug or implemented the feature, and link to any design docs.

Tracking Issue

flyteorg/flyte#366

Follow-up issue

NA

@codecov-commenter

codecov-commenter commented Jun 25, 2020

Codecov Report

Merging #101 into master will decrease coverage by 0.00%.
The diff coverage is 58.82%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #101      +/-   ##
==========================================
- Coverage   59.76%   59.76%   -0.01%     
==========================================
  Files          95       95              
  Lines        5331     5348      +17     
==========================================
+ Hits         3186     3196      +10     
- Misses       1844     1848       +4     
- Partials      301      304       +3     
Impacted Files                        | Coverage Δ
go/tasks/plugins/k8s/spark/spark.go   | 81.16% <58.82%> (-1.85%) ⬇️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update cf37142...90fc45c. Read the comment docs.


func populateSparkConfig(sparkConfig, userSparkConfig map[string]string) error {
	for k, v := range userSparkConfig {
		if k == "spark.executor.instances" && sparkConfig["spark.executor.instances"] != "" {
Contributor


The current behavior of this config value is to act as the default when the user doesn't provide one. This change would repurpose it as the max.

Let's add a separate section for limits?

Contributor Author


Updated with separate section.
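For context, a minimal sketch of the clamping behavior a separate limits section enables: the user-requested executor count is capped at the configured limit instead of being overwritten by the default. The function name and signature here are illustrative, not the PR's actual code.

```go
package main

import (
	"fmt"
	"strconv"
)

// capExecutorInstances clamps the user-requested
// "spark.executor.instances" value to a separately configured maximum.
// Both values arrive as strings, as they do in the Spark config map.
func capExecutorInstances(requested, limit string) (string, error) {
	reqN, err := strconv.Atoi(requested)
	if err != nil {
		return "", fmt.Errorf("invalid executor count %q: %w", requested, err)
	}
	limN, err := strconv.Atoi(limit)
	if err != nil {
		return "", fmt.Errorf("invalid executor limit %q: %w", limit, err)
	}
	if reqN > limN {
		// requested count exceeds the limit: return the capped value
		return limit, nil
	}
	return requested, nil
}

func main() {
	capped, _ := capExecutorInstances("500", "100")
	fmt.Println(capped)
}
```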

@migueltol22 migueltol22 requested a review from akhurana001 June 25, 2020 18:56

// Spark Limits config
type Limits struct {
	ExecutorLimit string `json:"executor-limit" pflag:",Max limit on number of executors allowed to be launched."`
}
Contributor


nit: would suggest renaming to ExecutorCountLimit or similar, just to be more explicit in case we want to add memory or CPU limits as well.

Contributor Author


Updated.
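After the suggested rename, the limits section presumably looks something like the following (a sketch assuming the rename was applied as proposed; the json/pflag tag strings are guesses, not taken from the final diff):

```go
// Limits contains resource limits for Spark tasks.
type Limits struct {
	// ExecutorCountLimit caps the number of executors a Spark app may request.
	ExecutorCountLimit string `json:"executor-count-limit" pflag:",Max limit on number of executors allowed to be launched."`
}
```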

@akhurana001
Contributor

looks good.

akhurana001
akhurana001 previously approved these changes Jul 8, 2020
@kumare3
Contributor

kumare3 commented Aug 18, 2020

@migueltol22 is this required and are you merging it?
